There Are Four Lights: The EARN IT Act Is Back and Still Mathematically Incoherent
from Net Politics, Digital and Cyberspace Policy Program, and Renewing America


A surveillance camera is seen near a Chinese flag in Shanghai, China. Aly Song/Reuters

The EARN IT Act is back for a third time. The current version purports to both maintain privacy and protect children, but this is a false promise; the act would expand state power and decrease users' privacy.

June 8, 2023 11:09 am (EST)

Blog posts represent the views of CFR fellows and staff and not those of CFR, which takes no institutional positions.

Politics gets done through compromise and deals; a life in politics teaches the art of the partial achievement. But when politics bumps up against a universal constant, a truth of math or physics, politicians often simply cannot grok that there are things they can't change, can't bargain against, can't shift from the bedrock of truth. One of those things is the world's most powerful weapon and shield on the internet: cryptography. The continued reappearance of the zombie EARN IT Act demonstrates, time and time again, that technologists are failing to explain and policymakers are failing to understand that there is no such thing as partially broken encryption.

The EARN IT Act, a bipartisan bill intended to penalize companies for not spying on their customers by breaking their end-to-end encryption, was reintroduced for the third time several weeks ago. The act would, among other things, carve an exception into Section 230 of the Communications Decency Act, exposing companies to additional legal liability for hosting child sexual abuse material (CSAM). That sounds like a good thing to do, but the incentives not to hold CSAM already exist in law (and in morality). Knowingly holding CSAM is already like holding a bag of fentanyl; making it extra-special, double cross-our-hearts illegal adds nothing. It's already very much illegal for companies to knowingly host CSAM, and many companies already try to detect it proactively while respecting the privacy of their users.


The bill's sponsors, however, think current efforts are not enough, in part because standard CSAM detection methods don't work in end-to-end encrypted settings. The authors of the EARN IT Act have said that its goal is to push companies to greatly increase their scanning of users' files and communications. EARN IT forces companies to spy on their users by removing Section 230 protections from companies that host illegal material even unknowingly, and encryption is part of that lack of knowledge. In information security and technology, that lack of knowledge is a good thing: it is the privacy of anyone who uses online services to do anything at all, and it is arguably the reason for the modern internet's success. To achieve EARN IT's goals (and avoid liability exposure) in practice, companies would need to break encryption and greatly expand their surveillance of American citizens, while turning that material over on demand to U.S. law enforcement.
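To make the technical conflict concrete, here is a minimal sketch in Python of why provider-side scanning and end-to-end encryption cannot coexist. The hash list and the XOR "cipher" below are hypothetical stand-ins (deployed systems use perceptual hashes such as PhotoDNA and real ciphers), but the structural point is the same: a server can only match fingerprints of content it can read, and end-to-end encryption exists precisely so that the server can read nothing.

# A minimal sketch, with hypothetical stand-ins, of why server-side
# hash matching works on plaintext but fails on end-to-end encrypted data.
import hashlib
import os

# Stand-in for a provider's list of known-bad fingerprints. Real systems
# use perceptual hashes (e.g., PhotoDNA), not SHA-256; this is illustrative.
KNOWN_BAD_HASHES = {hashlib.sha256(b"known-bad-file").hexdigest()}

def server_side_scan(blob: bytes) -> bool:
    """Flag a blob whose fingerprint matches the known-bad list."""
    return hashlib.sha256(blob).hexdigest() in KNOWN_BAD_HASHES

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """One-time-pad XOR, a stand-in for a real end-to-end cipher."""
    return bytes(p ^ k for p, k in zip(plaintext, key))

upload = b"known-bad-file"
print(server_side_scan(upload))       # True: plaintext is matchable

key = os.urandom(len(upload))         # the key lives only on user devices
print(server_side_scan(toy_encrypt(upload, key)))  # False: ciphertext reveals nothing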

If this bill were pointed at landlords and tenants, it would be the digital equivalent of requiring landlords to constantly search their unknowing and unconsenting tenants' homes for any evidence of child sex crimes. It's already illegal for landlords to knowingly permit crimes, but demanding that they repeatedly search their tenants' homes and turn over the evidence becomes distressingly dystopian and forces them into becoming unwilling agents of the state instead of private citizens or companies. (Which is unconstitutional, by the way: if a law dragoons a digital landlord into being an agent of the state, its warrantless searches of a tenant's online home violate the Fourth Amendment.) Continuing the brick-and-mortar metaphor, EARN IT's authors want landlords to replace tenants' strong locks with locks that open whenever anyone wants to poke about looking for child sex crimes, whether there's any reason to do so or not. And they swear that privacy and security for those tenants will not be meaningfully affected.

That, of course, is nonsense. It is not mathematically possible, as lawmakers repeatedly demand, to break encryption only a little bit in order to search for CSAM. Once encryption is broken, it is all the way broken. Here's an analogy to illustrate. Remember 1984 by George Orwell? There's a moment when Winston Smith sees O'Brien holding up four fingers and is told that if he will only say there are five, the torture will stop. Those who remember the incredible performances of Sir Patrick Stewart and David Warner in Star Trek: The Next Generation's retelling of Orwell's story, the brilliant two-part episode "Chain of Command," will recall Captain Picard being told by his Cardassian captor that he must say there are five lights in the room, though there are only four.

What if O'Brien or Gul Madred, in that situation, had been willing to compromise, as the U.S. government often says it is? All Winston had to do was say there were 4.5 fingers up. All Picard had to do was say there were 4.5 lights. That's meeting their opponents halfway, right? It may be a compromise, but it is neither a meaningful one nor a true one. It is not physically possible for there to be 4.5 fingers or 4.5 lights; there either is or is not an extra finger or an extra light, and saying anything else may stop the torture, but it doesn't make it true. We cannot tell policymakers that it is possible to compromise on encryption, because there is no compromise to be had. Encryption is either broken or it is not. There are four fingers. THERE ARE FOUR LIGHTS. Saying there are five because that's what a policymaker wants to hear, or saying there are 4.5 because it seems like a compromise, makes us liars without solving any of the underlying problems.
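For readers who prefer code to Cardassians, here is a small sketch of that all-or-nothing property, written with the third-party Python cryptography package and a hypothetical "escrow" key standing in for any lawful-access mechanism. A second key either decrypts none of a conversation or, once the system is redesigned to encrypt a copy to it, every bit of it; there is no 4.5.

# A sketch, under the assumptions above, of why key access is all-or-nothing.
import os
from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

user_key = AESGCM.generate_key(bit_length=256)    # held only by the endpoints
escrow_key = AESGCM.generate_key(bit_length=256)  # hypothetical "lawful access" key

nonce = os.urandom(12)
message = b"a private conversation"
ciphertext = AESGCM(user_key).encrypt(nonce, message, None)

# Without the user key, the escrow key recovers nothing at all.
try:
    AESGCM(escrow_key).decrypt(nonce, ciphertext, None)
except InvalidTag:
    print("escrow key decrypts nothing")

# The only way to grant "partial" access is to encrypt a full second copy
# to the escrow key, at which point whoever holds that key can read everything.
backdoor_copy = AESGCM(escrow_key).encrypt(nonce, message, None)
print(AESGCM(escrow_key).decrypt(nonce, backdoor_copy, None))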

Privacy and protection are features of technologies and of relationships, whether between parent and child or between state and citizen. But encryption is not a feature of a relationship or a policy tradeoff. It is a fundamental mathematical fact, indifferent to our wishes and desires. Famously broken cryptographic algorithms like the DES cipher and the SHA-1 hash function were widely used until they were cracked, even though a lot of people found it mightily inconvenient to update their protocols. Most experts think that a cryptographic random number generator shipped by the company RSA, Dual_EC_DRBG, was deliberately backdoored. And once any cryptographic algorithm is broken, it can take as little as a few minutes for the vulnerability to be known worldwide.
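The DES example is worth a back-of-envelope calculation. DES uses a 56-bit key, and brute force needs no cleverness at all once hardware can enumerate the keyspace; the search rate below is an assumed round figure, not a benchmark of any particular machine.

# Back-of-envelope arithmetic (assumed search rate) for exhausting DES's keyspace.
keyspace = 2 ** 56                     # every possible 56-bit DES key
rate = 1e12                            # assumed keys tested per second
hours = keyspace / rate / 3600
print(f"{hours:.0f} hours to try every DES key")  # roughly 20 hours

The EFF's purpose-built Deep Crack machine made the same point in 1998, recovering a DES key in under three days.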

I've written about how the widely derided Montana TikTok ban is enforceable only through the creation of a Chinese-style surveillance state. EARN IT would force companies to participate in exactly that kind of surveillance state, in defiance of constitutional principles. The Chinese government has that level of access to its citizens' data because it holds the encryption keys to nearly all consumer software and internet products in China, and two too many U.S. senators (Richard Blumenthal and Lindsey Graham) are stubbornly trying to replicate the same surveillance state with EARN IT under the banner of "think of the children."


One of the most fundamental choices we make as U.S. citizens is how much protection we want versus how much privacy we keep to make our own unsupervised choices. That is not a choice available to Chinese citizens. We can make tradeoffs between privacy and protection. Many parents do this for their kids by buying them a cell phone when they are young, mostly to track their location, supervise their communications to protect them, and give them a way to call for help if needed. As kids grow, parents will ideally scale back their supervision of the child's phone as the kid earns trust and demonstrates the ability to make good decisions, while keeping the right, and in fact the legal responsibility, to look in if they become concerned. But simply forcing companies to begin surveilling all citizens while swearing to only the best intentions doesn't work. Michael Chertoff said it beautifully while being interviewed by Tom Brokaw at a 2006 Council on Foreign Relations event:

There’s a tremendous desire to have the government tell people that we will in fact protect them against every risk in every place at every moment.  And I will tell you that we cannot do that, and we will not do it.  The price to do that would be to convert our society, which is a free and open society, into a bankrupt police state.

What we have to do is, we have to intelligently and honestly assess the trade-offs.  We have to understand what are the benefits, what are the risks and what are the costs.  And then we have to have an open and honest discussion about what are the costs we’re prepared to bear in order to achieve a reasonable but not a perfect level of security.

Traditionally, when law enforcement has had a giant pile of data on potential crimes, it has never stopped at searching that pile for only one kind of crime, and indeed it will justify illegal uses of that pile to catch criminals. Agencies use novel readings of laws to push new uses for data collected for other reasons, or attempt subpoenas for that data, including in ways that chill investigative journalism. They will always find a reason to search for the second and third and dozenth kind of crime, because the data is right there to be found and used. That is part of the reason we have a Fourth Amendment right against unreasonable search and seizure. The founders of this country knew well that once mass surveillance and law enforcement overreach get a toehold, they will not stop, and worse, they can feed a vicious spiral of self-censorship and data collection that damages our freedom of expression beyond repair.

I can sympathize with, and understand, the deep anger of law enforcement officials who are unable to access an existing pile of data that could help them solve devastating crimes against the most vulnerable members of our society. What I cannot tell is whether there is any end to the amount of data those officers want to hold on all of us. Taken to its logical conclusion, the best data pile for solving every crime would be a record of every conversation, by voice or text, that any of us has ever had. Weighing the tradeoff between privacy and protection is hard, and it is something we as a society must keep grappling with; it is easy to go too far in either direction, because giving someone more privacy does mean surveilling them less. But there is a difference between privacy and protection, which are values and rights and hence can be balanced against each other, and immutable truths like how math works.

Most depressingly, the first, second, and now third EARN IT Act is bipartisan bad math. There are Republican and Democratic supporters of EARN IT, which again reflects how desperately policymakers want a simpler downstream penalty for a symptom rather than work on a complex upstream social problem that could be mitigated and prevented. EARN IT is yet another case of politicians not understanding, or ignoring, the technical realities of encryption. You can't be a little bit pregnant, you can't be kind of dead, and encryption can't be halfway broken.

This work is licensed under Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License.